Degree of Approximation Results for Feedforward Networks Approximating Unknown Mappings and Their Derivatives. Produced as Part of the ESPRIT Working Group in Neural and Computational Learning, NeuroCOLT 8556

Authors

  • Kurt Hornik
  • Maxwell Stinchcombe
  • Halbert White
  • Peter Auer
Abstract

Recently Barron (1993) has given rates for single-hidden-layer feedforward networks with sigmoid activation functions approximating a class of functions satisfying a certain smoothness condition. These rates do not depend on the dimension of the input space. We extend Barron's results to feedforward networks with possibly non-sigmoid activation functions approximating mappings and their derivatives simultaneously. Our conditions are similar but not identical to Barron's, yet we obtain the same rates of approximation, showing that the approximation error decreases at rates as fast as n^{-1/2}, where n is the number of hidden units. The dimension of the input space appears only in the constants of our bounds.
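For orientation, the Barron (1993) bound that these results extend has roughly the following form. This is a hedged paraphrase, not this paper's exact statement; the constant C_f is Barron's Fourier-based smoothness measure, and r and μ denote the radius of the input domain and the input distribution, respectively.

```latex
% Sketch of the form of Barron's bound (not the exact statement of this paper).
% Assume f satisfies the first-moment smoothness condition
%   C_f = \int_{\mathbb{R}^d} \|\omega\| \, |\hat{f}(\omega)| \, d\omega < \infty .
% Then for every n there is a single-hidden-layer network
%   f_n(x) = \sum_{k=1}^{n} c_k \, \sigma(a_k^\top x + b_k)
% with sigmoid activation \sigma such that, on the ball B_r of radius r,
\left( \int_{B_r} \bigl( f(x) - f_n(x) \bigr)^2 \, \mu(dx) \right)^{1/2}
  \;\le\; \frac{2 r C_f}{\sqrt{n}} .
% The L^2(\mu) error thus decays at rate n^{-1/2}, with the input
% dimension d entering only through the constant C_f.
```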


Similar articles

Probabilistic Analysis of Learning in Artificial Neural Networks: The PAC Model and Its Variants

A version of this is to appear as a chapter in The Computational and Learning Complexity of Neural Networks (ed. Ian Parberry), MIT Press. There are a number of mathematical approaches to the study of learning and generalization in artificial neural networks. Here we survey the `probably approximately correct' (PAC) model of learning and some of its variants. These models, much-stud...


Computing the Maximum Bichromatic Discrepancy, with Applications to Computer Graphics and Machine Learning

Computing the maximum bichromatic discrepancy is an interesting theoretical problem with important applications in computational learning theory, computational geometry and computer graphics. In this paper we give algorithms to compute the maximum bichromatic discrepancy for simple geometric ranges, including rectangles and halfspaces. In addition, we give extensions to other discrepancy problems.


A General Feedforward Neural Network Model (Cédric Gégout)

In this paper, we generalize a model proposed by Léon Bottou and Patrick Gallinari in [3]. This model gives a general mathematical description of feedforward neural networks, for which standard models, such as Multi-Layer Perceptrons or Radial Basis Function based neural networks, are only particular cases. A generalized back-propagation, which gives an efficient way to compute the differential of...


Techniques in Neural Learning

This paper takes ideas developed in a theoretical framework by Maass [8] and adapts them for a practical learning algorithm for feedforward sigmoid neural networks. A number of different techniques are presented which are based loosely around the common theme of taking advantage of the linearity of the net input to a neuron, or in other words the fact that there is only a single non-linearity pe...




Publication date: 1995